MAP inference
- Information Technology > Artificial Intelligence > Representation & Reasoning > Search (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.68)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.48)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.48)
The Theory and Practice of MAP Inference over Non-Convex Constraints
Leander Kurscheidt, Gabriele Masina, Roberto Sebastiani, Antonio Vergari
In many safety-critical settings, probabilistic ML systems have to make predictions subject to algebraic constraints, e.g., predicting the most likely trajectory that does not cross obstacles. These real-world constraints are rarely convex, nor are the densities considered (log-)concave. This makes computing the constrained maximum a posteriori (MAP) prediction efficiently and reliably extremely challenging. In this paper, we first investigate under which conditions we can perform constrained MAP inference over continuous variables exactly and efficiently, and devise a scalable message-passing algorithm for this tractable fragment. Then, we devise a general constrained MAP strategy that interleaves partitioning the domain into convex feasible regions with numerical constrained optimization. We evaluate both methods on synthetic and real-world benchmarks, showing that our approaches outperform constraint-agnostic baselines and scale to complex densities intractable for SoTA exact solvers.
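The partition-then-optimize idea in the abstract can be illustrated with a toy one-dimensional case. The sketch below is not the paper's algorithm (the function names and setup are invented for illustration): for a Gaussian density restricted to a non-convex feasible set given as a union of intervals, each interval is a convex subproblem whose restricted maximizer is just the mode clamped to the interval, and the constrained MAP is the best solution across pieces.

```python
import math

def gaussian_logpdf(x, mu=0.0, sigma=1.0):
    """Log-density of a univariate Gaussian."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def constrained_map_1d(mu, sigma, intervals):
    """MAP of N(mu, sigma^2) restricted to a union of closed intervals.

    Each interval is a convex feasible region: the restricted maximizer
    is the mode clamped to the interval. The global constrained MAP is
    the best of the per-region solutions.
    """
    best_x, best_lp = None, -math.inf
    for lo, hi in intervals:
        x = min(max(mu, lo), hi)            # convex subproblem: clamp the mode
        lp = gaussian_logpdf(x, mu, sigma)
        if lp > best_lp:
            best_x, best_lp = x, lp
    return best_x

# Non-convex feasible set: two disjoint intervals, neither containing the mode 0.
print(constrained_map_1d(0.0, 1.0, [(-3.0, -1.0), (2.0, 4.0)]))  # -1.0, the feasible point closest to the mode
```

In higher dimensions the clamping step becomes a numerical constrained optimization over each convex region, which is the interleaving the abstract describes.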
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty (0.93)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.93)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.68)
- Information Technology > Data Science (0.67)
Asynchronous Parallel Coordinate Minimization for MAP Inference
Finding the maximum a posteriori (MAP) assignment is a central task in graphical models. Since modern applications give rise to very large problem instances, there is an increasing need for efficient solvers. In this work we propose to improve the efficiency of coordinate-minimization-based dual-decomposition solvers by running their updates asynchronously in parallel. In this case message-passing inference is performed by multiple processing units simultaneously without coordination, all reading and writing to shared memory. We analyze the convergence properties of the resulting algorithms and identify settings where speedup gains can be expected. Our numerical evaluations show that this approach indeed achieves significant speedups in common computer vision tasks.
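A minimal, lock-free sketch of the asynchronous idea — not the paper's dual-decomposition MAP objective, but coordinate minimization of a generic convex quadratic under the same execution model: several worker threads repeatedly set a random coordinate to its exact minimizer given the (possibly stale) values of the others, all reading and writing shared memory without coordination.

```python
import random
import threading

# Diagonally dominant system: minimizing f(x) = 0.5 x^T A x - b^T x
# amounts to solving A x = b; exact coordinate minimization is a
# Gauss-Seidel-style update, here applied in random order by each thread.
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [1.0, 2.0, 3.0]
x = [0.0, 0.0, 0.0]  # shared memory, updated without locks

def worker(num_updates, seed):
    rng = random.Random(seed)
    n = len(x)
    for _ in range(num_updates):
        i = rng.randrange(n)
        # Exact minimizer of f along coordinate i, given the possibly
        # stale current values of the other coordinates.
        s = sum(A[i][j] * x[j] for j in range(n) if j != i)
        x[i] = (b[i] - s) / A[i][i]

threads = [threading.Thread(target=worker, args=(2000, s)) for s in range(4)]
for t in threads: t.start()
for t in threads: t.join()

residual = max(abs(sum(A[i][j] * x[j] for j in range(3)) - b[i]) for i in range(3))
print(x, residual)  # x converges to the solution of A x = b despite stale reads
```

Diagonal dominance makes the updates contractive, so the iterates converge even though threads may read stale coordinates — the kind of setting where the paper's analysis predicts speedup gains. (In CPython the GIL serializes the bytecode, so this only illustrates the access pattern, not true hardware parallelism.)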
First provide a summary of the paper, and then address the following criteria: Quality, clarity, originality and significance. This paper proposes several approaches to sample from a Gibbs distribution over a discrete space by solving randomly perturbed combinatorial optimization problems (MAP inference) over the same space. The starting point is a known result [5] that allows sampling (in principle, using high-dimensional perturbations with exponential complexity) by solving a single optimization problem. In this paper they propose to (1) use more efficient low-dimensional random perturbations for approximate sampling (with probabilistic accuracy guarantees on tree-structured models), and (2) estimate (conditional) marginals using ratios of partition-function estimates and sequentially sample variables. They propose a clever rejection strategy based on self-reduction that guarantees unbiasedness of the samples.
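The known result the review refers to is the Gumbel-max construction: adding i.i.d. Gumbel noise to each configuration's log-potential and taking the argmax yields an exact sample from the Gibbs distribution. The sketch below (illustrative names, not the paper's code) enumerates all configurations explicitly — which is exactly the exponential cost that motivates the low-dimensional perturbations the paper proposes.

```python
import math
import random

random.seed(0)  # fixed seed so the empirical check below is reproducible

def gumbel():
    """Sample from the standard Gumbel distribution via inverse CDF."""
    return -math.log(-math.log(random.random()))

def perturb_and_map(log_potentials):
    """Gumbel-max sampling: argmax_i (theta_i + g_i) with i.i.d. Gumbel
    noise g_i selects configuration i with probability proportional
    to exp(theta_i), i.e., an exact Gibbs sample."""
    return max(range(len(log_potentials)),
               key=lambda i: log_potentials[i] + gumbel())

theta = [1.0, 2.0, 0.5]                      # log-potentials of 3 configurations
z = sum(math.exp(t) for t in theta)
target = [math.exp(t) / z for t in theta]    # exact Gibbs probabilities

n = 20000
counts = [0] * len(theta)
for _ in range(n):
    counts[perturb_and_map(theta)] += 1
freqs = [c / n for c in counts]
print(freqs, target)  # empirical frequencies match the Gibbs distribution
```

For an MRF over many variables the configuration space is exponential, so each perturbed MAP problem above would itself be intractable — hence the paper's move to low-dimensional perturbations with approximate guarantees.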
The paper gives a nice introduction to sensitivity analysis in graphical models, and proposes a method to check whether the MAP configuration remains unchanged when the parameters are perturbed simultaneously. The method works by selecting perturbed factors (using local computations) so as to create a worst-case scenario. The second-best MAP configuration is then identified for this worst-case scenario. Theoretically, the complexity is therefore equivalent to that of the MAP query (exponential in the treewidth), if the number of variables is constant.
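The stability question in the review above can be sketched with a brute-force margin check — a conservative sufficient condition, not the paper's local-computation method (the model, numbers, and bound below are illustrative assumptions): if each factor entry can move by at most eps, the MAP assignment can lose at most eps per factor while any rival gains at most eps per factor, so the MAP is guaranteed unchanged whenever the best/second-best margin exceeds 2 * eps * (number of factors).

```python
from itertools import product

def map_is_stable(factors, eps, num_vals=2):
    """Brute-force MAP stability check for a small discrete model.

    `factors` maps variable-index tuples (scopes) to score tables keyed by
    value tuples. Returns True if the best/second-best margin exceeds the
    worst-case total score change 2 * eps * len(factors), a sufficient
    condition for the MAP to survive any entrywise perturbation <= eps.
    """
    vars_ = sorted({v for scope in factors for v in scope})

    def score(assign):
        return sum(table[tuple(assign[v] for v in scope)]
                   for scope, table in factors.items())

    scores = sorted((score(dict(zip(vars_, a)))
                     for a in product(range(num_vals), repeat=len(vars_))),
                    reverse=True)
    margin = scores[0] - scores[1]           # best minus second-best MAP score
    return margin > 2 * eps * len(factors)

# Tiny chain x0 - x1 with unary and pairwise factors (hypothetical scores).
factors = {
    (0,): {(0,): 0.0, (1,): 1.5},
    (1,): {(0,): 0.2, (1,): 0.0},
    (0, 1): {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.6},
}
print(map_is_stable(factors, eps=0.05))  # True: margin 0.4 > 2*0.05*3 = 0.3
print(map_is_stable(factors, eps=0.5))   # False: 2*0.5*3 = 3.0 exceeds the margin
```

The paper's contribution is to reach a comparable conclusion without enumerating all assignments, by constructing the worst-case perturbation locally and solving a single second-best MAP query.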